Probabilistic learning vector quantization on manifold of symmetric positive definite matrices

Authors

Abstract

In this paper, we develop a new classification method for manifold-valued data in the framework of probabilistic learning vector quantization. In many scenarios, data can be naturally represented by symmetric positive definite (SPD) matrices, which are inherently points living on a curved Riemannian manifold. Due to the non-Euclidean geometry of such manifolds, traditional Euclidean machine learning algorithms yield poor results on such data. We generalize the probabilistic learning vector quantization algorithm to data living on the manifold of SPD matrices equipped with a natural Riemannian metric, the affine-invariant metric. By exploiting the induced Riemannian distance, we derive the probabilistic learning vector quantization algorithm on this space, obtaining the learning rule through Riemannian gradient descent. Empirical investigations on synthetic data, image data, and motor imagery electroencephalogram (EEG) data demonstrate the superior performance of the proposed method.
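The key geometric ingredient named in the abstract is the affine-invariant distance between SPD matrices, d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F. As an illustration only (not the authors' implementation), this distance reduces to a generalized eigenvalue problem; the function name and example matrices below are hypothetical.

```python
import numpy as np
from scipy.linalg import eigh

def affine_invariant_distance(A, B):
    """Geodesic (affine-invariant) distance between SPD matrices A and B.

    d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F = sqrt(sum_i log(l_i)^2),
    where l_i are the generalized eigenvalues of B v = l A v.
    """
    lam = eigh(B, A, eigvals_only=True)  # eigenvalues of A^{-1} B, all > 0
    return np.sqrt(np.sum(np.log(lam) ** 2))

# Hypothetical example: distance between a 2x2 SPD matrix and the identity.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
B = np.eye(2)
d = affine_invariant_distance(A, B)
```

The distance is symmetric (swapping A and B inverts the eigenvalues, which only flips the sign of their logarithms) and is invariant under congruence transforms A -> M A M^T, which is the property that gives the metric its name.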



Similar articles

Learning general Gaussian kernel hyperparameters of SVMs using optimization on symmetric positive-definite matrices manifold

We propose a new method for general Gaussian kernel hyperparameter optimization for support vector machines classification. The hyperparameters are constrained to lie on a differentiable manifold. The proposed optimization technique is based on a gradient-like descent algorithm adapted to the geometrical structure of the manifold of symmetric positive-definite matrices. We compare the performan...
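The snippet above describes gradient-like descent adapted to the geometry of the SPD manifold. The paper's exact update is not reproduced on this page; as a sketch under the standard affine-invariant geometry, a descent step follows the exponential map Exp_X(V) = X^{1/2} exp(X^{-1/2} V X^{-1/2}) X^{1/2}, which keeps iterates on the manifold. All names below are illustrative, not from the paper.

```python
import numpy as np

def _spd_sqrt(X):
    # Symmetric square root of an SPD matrix via eigendecomposition.
    w, V = np.linalg.eigh(X)
    return (V * np.sqrt(w)) @ V.T

def _sym_expm(S):
    # Matrix exponential of a symmetric matrix (real-valued by construction).
    w, V = np.linalg.eigh(S)
    return (V * np.exp(w)) @ V.T

def spd_exp(X, V):
    """Exponential map at SPD point X in tangent direction V (symmetric)."""
    Xh = _spd_sqrt(X)
    Xih = np.linalg.inv(Xh)
    return Xh @ _sym_expm(Xih @ V @ Xih) @ Xh

# Hypothetical descent step: move from X against a symmetric "gradient" G.
X = np.diag([2.0, 1.0])
G = np.eye(2)
X_next = spd_exp(X, -0.1 * G)  # remains symmetric positive definite
```

Because the step is taken along a geodesic rather than by Euclidean subtraction, positive definiteness is preserved for any step size, which is the point of using the manifold structure.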


Deep Manifold Learning of Symmetric Positive Definite Matrices with Application to Face Recognition

In this paper, we aim to construct a deep neural network which embeds high dimensional symmetric positive definite (SPD) matrices into a more discriminative low dimensional SPD manifold. To this end, we develop two types of basic layers: a 2D fully connected layer which reduces the dimensionality of the SPD matrices, and a symmetrically clean layer which achieves non-linear mapping. Specificall...
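The dimensionality-reducing layer described above is, in SPD network architectures of this kind, typically a bilinear map A -> W A W^T with a full-row-rank W, which sends an n x n SPD matrix to a smaller m x m SPD matrix. The sketch below is a generic illustration of that operation, not the paper's code.

```python
import numpy as np

def bimap(A, W):
    """Bilinear mapping layer: reduce an n x n SPD matrix A to m x m via W A W^T.

    If W (shape m x n, m < n) has full row rank, the output is again SPD,
    so the result stays on a (lower-dimensional) SPD manifold.
    """
    return W @ A @ W.T

# Hypothetical example: project a 3x3 SPD matrix down to 2x2.
A = np.diag([1.0, 2.0, 3.0])
W = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
R = bimap(A, W)
```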


Supervised LogEuclidean Metric Learning for Symmetric Positive Definite Matrices

Metric learning has been shown to be highly effective to improve the performance of nearest neighbor classification. In this paper, we address the problem of metric learning for symmetric positive definite (SPD) matrices such as covariance matrices, which arise in many real-world applications. Naively using standard Mahalanobis metric learning methods under the Euclidean geometry for SPD matric...


Emotion Recognition by Body Movement Representation on the Manifold of Symmetric Positive Definite Matrices

Emotion recognition is attracting great interest for its potential application in a multitude of real-life situations. Much of the Computer Vision research in this field has focused on relating emotions to facial expressions, with investigations rarely including more than the upper body. In this work, we propose a new scenario, for which emotional states are related to 3D dynamics of the whole body...


Riemannian Metric Learning for Symmetric Positive Definite Matrices

Over the past few years, symmetric positive definite (SPD) matrices have been receiving considerable attention from the computer vision community. Though various distance measures have been proposed in the past for comparing SPD matrices, the two most widely used measures are the affine-invariant distance and the log-Euclidean distance. This is because these two measures are true geodesic distances induced...
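Of the two measures named above, the log-Euclidean distance d_LE(A, B) = ||log(A) - log(B)||_F is the cheaper one, since the matrix logarithms can be precomputed per matrix. A generic illustration (function names are hypothetical, not from the paper):

```python
import numpy as np

def spd_log(X):
    # Matrix logarithm of an SPD matrix via eigendecomposition (real-valued).
    w, V = np.linalg.eigh(X)
    return (V * np.log(w)) @ V.T

def log_euclidean_distance(A, B):
    """Log-Euclidean distance: Frobenius norm of log(A) - log(B)."""
    return np.linalg.norm(spd_log(A) - spd_log(B), ord="fro")

# Hypothetical example: the two SPD matrices commute here, so the
# log-Euclidean and affine-invariant distances coincide.
A = 2.0 * np.eye(2)
B = np.eye(2)
d = log_euclidean_distance(A, B)
```

Unlike the affine-invariant distance, d_LE is only invariant under orthogonal congruence and scaling, not arbitrary affine congruence, which is the usual trade-off between the two metrics.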



Journal

Journal: Neural Networks

Year: 2021

ISSN: 1879-2782, 0893-6080

DOI: https://doi.org/10.1016/j.neunet.2021.04.024